Vector approximate message passing (VAMP) is a computationally simple approach to the recovery of a signal $\mathbf{x}$ from noisy linear measurements $\mathbf{y}=\mathbf{Ax}+\mathbf{w}$. Like the AMP proposed by Donoho, Maleki, and Montanari in 2009, VAMP is characterized by a rigorous state evolution (SE) that holds under certain large random matrices and that matches the replica prediction of optimality. But while AMP's SE holds only for large i.i.d. sub-Gaussian $\mathbf{A}$, VAMP's SE holds under the much larger class of right-rotationally invariant $\mathbf{A}$. To run VAMP, however, one must specify the statistical parameters of the signal and noise. This work combines VAMP with Expectation-Maximization to yield an algorithm, EM-VAMP, that can jointly recover $\mathbf{x}$ while learning those statistical parameters. The fixed points of the proposed EM-VAMP algorithm are shown to be stationary points of a certain constrained free energy, providing a variational interpretation of the algorithm. Numerical simulations show that EM-VAMP is robust to highly ill-conditioned $\mathbf{A}$, with performance nearly matching oracle-parameter VAMP.